Similar resources
Nonlinear backpropagation: doing backpropagation without derivatives of the activation function
The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the necessity for calculating the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithms in the framework of recurrent backpropagation and present some numerical simulations of fee...
The relationship between learners' critical thinking ability and their performance in the reading sections of the TOEFL and IELTS tests
The study reflected in this thesis aims at finding relationships between critical thinking (CT) and the reading sections of the TOEFL and IELTS tests. The study tries to find any relationship between the CT ability of students and their performance on the reading tests of TOEFL and academic IELTS. However, no research has ever been conducted to investigate the relationship between CT and the read...
Backpropagation: The Basic Theory
Since the publication of the PDP volumes in 1986, learning by backpropagation has become the most popular method of training neural networks. The reason for this popularity is the underlying simplicity and relative power of the algorithm. Its power derives from the fact that, unlike its precursors, the perceptron learning rule and the Widrow-Hoff learning rule, it can be employed for training n...
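The core idea the abstract alludes to, propagating the output error backward through the chain rule to obtain weight gradients, can be sketched minimally. This is a toy two-layer network on synthetic data; all sizes, names, and hyperparameters here are illustrative, not taken from the cited chapter:

```python
import numpy as np

# Toy regression data: 8 samples, 3 features (synthetic, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
y = rng.normal(size=(8, 1))

# Two weight matrices: input->hidden (3x4) and hidden->output (4x1).
W1 = rng.normal(scale=0.5, size=(3, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(W1, W2):
    h = sigmoid(X @ W1)                  # forward pass: hidden activations
    return 0.5 * np.mean((h @ W2 - y) ** 2)

initial = loss(W1, W2)
lr = 0.1
for _ in range(500):
    # Forward pass.
    h = sigmoid(X @ W1)
    pred = h @ W2
    # Backward pass: chain rule from the output error down to each weight matrix.
    err = (pred - y) / len(X)            # dLoss/dpred
    gW2 = h.T @ err                      # gradient for the output weights
    dh = err @ W2.T                      # error propagated into the hidden layer
    da = dh * h * (1 - h)                # sigmoid derivative applied here
    gW1 = X.T @ da                       # gradient for the input weights
    # Explicit gradient descent steps.
    W1 -= lr * gW1
    W2 -= lr * gW2

final = loss(W1, W2)
print(final < initial)                   # training reduces the loss
```

Note that the backward pass needs the derivative of the activation (`h * (1 - h)` for the sigmoid); avoiding exactly this computation is the point of the nonlinear backpropagation variant listed above.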
The survey of virtual higher education in Iran and the ways of its development and improvement
This study was carried out with the aim of examining the current state of virtual higher education in Iran and the ways to develop and improve it, using a descriptive-analytical and survey method. A review of the existing documents on virtual education showed that the numbers of students, degree levels, and fields offering e-learning programs are far from satisfactory, and that, in qualitative terms, the indicators of instructors' educational services and the state of the Internet network in the virtual learning environment are also unsatisfactory.
Proximal Backpropagation
We propose proximal backpropagation (ProxProp) as a novel algorithm that takes implicit instead of explicit gradient steps to update the network parameters during neural network training. Our algorithm is motivated by the step size limitation of explicit gradient descent, which poses an impediment for optimization. ProxProp is developed from a general point of view on the backpropagation algori...
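The distinction the abstract draws between explicit and implicit (proximal) gradient steps can be illustrated on a simple quadratic. This is a hedged sketch of the general idea only, not the paper's layer-wise ProxProp algorithm; the matrix, step size, and iteration counts are made up for demonstration:

```python
import numpy as np

# Minimize f(w) = 0.5 w^T A w - b^T w with a badly conditioned A,
# comparing an explicit gradient step against an implicit (proximal) step.
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
w_star = np.linalg.solve(A, b)           # exact minimizer

def explicit_step(w, tau):
    # Forward (explicit) gradient step: w - tau * grad f(w).
    return w - tau * (A @ w - b)

def implicit_step(w, tau):
    # Proximal step: argmin_v f(v) + ||v - w||^2 / (2 tau),
    # which for a quadratic means solving (A + I/tau) v = b + w/tau.
    return np.linalg.solve(A + np.eye(len(w)) / tau, b + w / tau)

tau = 0.1                                # well above the explicit stability limit 2/100
w_exp = np.zeros(2)
w_imp = np.zeros(2)
for _ in range(200):
    w_exp = explicit_step(w_exp, tau)
    w_imp = implicit_step(w_imp, tau)

print(np.linalg.norm(w_exp - w_star) > 1e6)   # explicit iteration diverges
print(np.allclose(w_imp, w_star, atol=1e-6))  # implicit iteration converges
```

The implicit step costs a linear solve per update but remains stable for step sizes where the explicit step diverges, which is the step-size limitation the abstract refers to.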
Journal
Journal title: Nature Reviews Neuroscience
Year: 2020
ISSN: 1471-003X, 1471-0048
DOI: 10.1038/s41583-020-0277-3